Lifted Relational Neural Networks

Authors

  • Gustav Sourek
  • Vojtech Aschenbrenner
  • Filip Zelezný
  • Ondrej Kuzelka
Abstract

We propose a method combining relational-logic representations with deep neural network learning. Domain-specific knowledge is described through relational rules, which may be handcrafted or learned. The relational rule set serves as a template for unfolding possibly deep neural networks whose structures also reflect the structure of the given training or testing examples. Different networks corresponding to different examples share their weights, which co-evolve during training by a stochastic gradient descent algorithm. Notable relational concepts can be discovered by interpreting the shared hidden-layer weights corresponding to the rules. Experiments on 78 relational learning benchmarks demonstrate the favorable performance of the method.
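The core mechanism the abstract describes is that one set of template weights, one per relational rule, is shared across the networks unfolded from different examples, and gradients from every example's network update the same shared weights. The following is a minimal illustrative sketch of that idea under assumed toy names (`rule_friend`, `rule_colleague`, the example groundings, and the single-neuron network are invented for illustration and are not the authors' actual framework or API):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Shared template weights: one weight per relational rule (toy names).
weights = {"rule_friend": 0.5, "rule_colleague": 0.5}

# Each example "unfolds" the templates differently: here, a list of
# (rule_name, activation) groundings feeding one output neuron,
# paired with a target label.
examples = [
    ([("rule_friend", 1.0), ("rule_friend", 1.0)], 1.0),    # two groundings of one rule
    ([("rule_friend", 1.0), ("rule_colleague", 1.0)], 0.0), # mixed groundings
]

def forward(grounding):
    # Ground network: weighted sum of rule activations, squashed.
    return sigmoid(sum(weights[r] * x for r, x in grounding))

def sgd_step(lr=0.1):
    # Weights co-evolve: gradients from every example-specific network
    # accumulate into the single shared weight set.
    grads = {k: 0.0 for k in weights}
    for grounding, target in examples:
        y = forward(grounding)
        delta = (y - target) * y * (1.0 - y)  # d(MSE/2)/dy * sigmoid'(.)
        for r, x in grounding:
            grads[r] += delta * x
    for k in weights:
        weights[k] -= lr * grads[k]

for _ in range(200):
    sgd_step()
```

After training, the rule appearing only in the negative example ends up with the smaller shared weight, so the two example networks separate their outputs even though they were never trained in isolation.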


Similar Papers

Learning Predictive Categories Using Lifted Relational Neural Networks

Lifted relational neural networks (LRNNs) are a flexible neural-symbolic framework based on the idea of lifted modelling. In this paper we show how LRNNs can easily be used to declaratively specify and solve a learning problem in which latent categories of entities and properties need to be jointly induced.


Stacked Structure Learning for Lifted Relational Neural Networks

Lifted Relational Neural Networks (LRNNs) describe relational domains using weighted first-order rules which act as templates for constructing feed-forward neural networks. While previous work has shown that using LRNNs can lead to state-of-the-art results in various ILP tasks, these results depended on hand-crafted rules. In this paper, we extend the framework of LRNNs with structure learning, ...


Tractable Learning of Liftable Markov Logic Networks

Markov logic networks (MLNs) are a popular statistical relational learning formalism that combines Markov networks with first-order logic. Unfortunately, inference and maximum-likelihood learning with MLNs are highly intractable. For inference, this problem is addressed by lifted algorithms, which speed up inference by exploiting symmetries. State-of-the-art lifted algorithms give tractability gu...


Learning and Exploiting Relational Structure for Efficient Inference

Aniruddh Nath. Chair of the Supervisory Committee: Professor Pedro Domingos, Computer Science & Engineering. One of the central challenges of statistical relational learning is the tradeoff between expressiveness and computational tractability. Representations such as Markov logic can capture rich joint probabilistic models over ...


A Sound and Complete Algorithm for Learning Causal Models from Relational Data

The PC algorithm learns maximally oriented causal Bayesian networks. However, there is no equivalent complete algorithm for learning the structure of relational models, a more expressive generalization of Bayesian networks. Recent developments in the theory and representation of relational models support lifted reasoning about conditional independence. This enables a powerful constraint for ori...



Journal:
  • CoRR

Volume: abs/1508.05128  Issue:

Pages: -

Publication date: 2015